1. The Chapman–Robbins bound also holds under much weaker regularity conditions (the bound is written out after this list).
|
2. This makes the regularity conditions unnecessary, as Baire measures are automatically regular.
|
3. Then, if regularity conditions are satisfied,
|
4. Which expresses the strong regularity condition.
|
5. Under mild regularity conditions, this process converges on maximum likelihood (or maximum posterior) values for the parameters.
|
6. For the variations and restricted by the constraints (assuming the constraints satisfy some regularity conditions) is generally
|
7. Note that a limit distribution need not exist: this requires regularity conditions on the tail of the distribution.
|
8. As a consequence, the rate of convergence of the Gauss–Newton algorithm can be quadratic under certain regularity conditions (a short iteration sketch follows this list).
|
9. It has been shown that under certain regularity conditions, learnable classes and uniformly Glivenko-Cantelli classes are equivalent.
|
10. The method can converge much faster, though, with an order that approaches 2, provided that f satisfies the regularity conditions described below.
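
For reference on item 1: the Chapman–Robbins (Hammersley–Chapman–Robbins) bound for an unbiased estimator T(X) of g(θ), for a family of densities f(x; θ), is usually stated as

\[
\operatorname{Var}_\theta(T) \;\ge\; \sup_{\Delta \ne 0}
\frac{\bigl(g(\theta+\Delta)-g(\theta)\bigr)^2}
     {\mathbb{E}_\theta\!\left[\bigl(f(X;\theta+\Delta)/f(X;\theta)-1\bigr)^2\right]},
\]

where the supremum runs over all Δ with θ + Δ still in the parameter space. The symbols T, g, and f(x; θ) are standard notation rather than anything taken from the quoted sentence; the relevant point is that, unlike the Cramér–Rao bound, no differentiability of f in θ is required, which is the sense in which the regularity conditions are weaker.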
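
Items 8 and 10 both concern near-quadratic convergence of an iterative method. Below is a minimal sketch of the Gauss–Newton iteration for a small nonlinear least-squares problem; the model, data, and starting point are made up for illustration, and the regularity conditions alluded to (a full-rank Jacobian near the solution, small residuals at the optimum) are only noted in comments, not checked.

```python
# Minimal Gauss-Newton sketch (hypothetical example, not taken from the
# sources quoted above).  Near-quadratic convergence needs the usual
# regularity conditions: a Jacobian of full column rank near the solution
# and small residuals at the optimum.
import numpy as np

def gauss_newton(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Minimize 0.5 * ||residual(x)||^2 with Gauss-Newton steps."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)                 # residual vector at the current iterate
        J = jacobian(x)                 # Jacobian of the residuals at x
        # Gauss-Newton step: solve min ||J dx + r|| in the least-squares sense.
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Hypothetical data roughly following y = 2 * exp(0.3 * t).
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([2.0, 2.7, 3.6, 4.9])

def residual(p):
    a, b = p
    return a * np.exp(b * t) - y

def jacobian(p):
    a, b = p
    return np.column_stack([np.exp(b * t), a * t * np.exp(b * t)])

print(gauss_newton(residual, jacobian, x0=[1.0, 0.1]))   # approx. [2.0, 0.3]
```

On this small-residual problem the step sizes shrink roughly quadratically once the iterate is close to the solution, which is the behaviour items 8 and 10 refer to.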
|